Current Issue: July - September | Volume: 2016 | Issue Number: 3 | Articles: 5
Virtual machines (VMs) on a Cloud platform can be influenced by a variety of factors that lead to degraded performance and downtime, affecting the reliability of the Cloud platform. Traditional anomaly detection algorithms and strategies for Cloud platforms have shortcomings in detection accuracy, detection speed, and adaptability. In this paper, a dynamic and adaptive anomaly detection algorithm for virtual machines based on Self-Organizing Maps (SOM) is proposed. A unified SOM-based modeling method is presented that captures machine performance within a detection region, which avoids the cost of modeling each virtual machine individually and improves the detection speed and reliability for large-scale virtual machines on a Cloud platform. The parameters that most affect modeling speed are optimized in the SOM process, significantly improving the accuracy of the SOM model and therefore the anomaly detection accuracy for the virtual machines...
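The abstract does not give the SOM details; as a rough, non-authoritative illustration of the general idea, the following Python sketch trains a small Self-Organizing Map on metric vectors collected from normally behaving virtual machines and flags new samples whose quantization error (distance to the best-matching unit) is unusually large. The grid size, metric set, and threshold are hypothetical and not taken from the paper.

```python
import numpy as np

def train_som(data, grid=(8, 8), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a simple Self-Organizing Map on rows of `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    n_features = data.shape[1]
    # Node weight vectors, initialised randomly within the data range.
    weights = rng.uniform(data.min(0), data.max(0), size=(gx, gy, n_features))
    # Grid coordinates, used by the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to the sample.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(dists.argmin(), dists.shape)
        # Decaying learning rate and neighbourhood radius.
        frac = t / n_iter
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 1e-3
        # Gaussian neighbourhood around the BMU on the grid.
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)
    return weights

def anomaly_scores(weights, data):
    """Quantization error of each sample: distance to its best-matching unit."""
    flat = weights.reshape(-1, weights.shape[-1])
    return np.min(np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1), axis=1)

# Hypothetical usage: rows are VM metric vectors (e.g. CPU, memory, disk I/O, network).
normal = np.random.rand(500, 4)           # samples collected during normal operation
som = train_som(normal)
threshold = np.percentile(anomaly_scores(som, normal), 99)
new_samples = np.random.rand(10, 4)
flags = anomaly_scores(som, new_samples) > threshold   # True marks a suspected anomaly
```

In this setup a single map covers all virtual machines in a detection region, mirroring the paper's idea of avoiding a separate model per machine.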
Computers and other digital communication media have replaced paper and pencil on the writer's desk. This development has confronted special collections with a problem, as digital papers are difficult to process using established digital preservation strategies because of their individual and unique nature. According to the proposal put forward in this paper, the creators should instead be involved in the preservation process, and special collections should integrate pre-custodial forms of curation into their range of tasks. The article outlines the use of a cloud architecture as a suitable instrument for accomplishing this task. The benefits and prospects of such a collection cloud are presented and discussed...
With the rapid development of cloud computing technology, the cloud has matured enough for many individuals and organizations to move their work into it. Correspondingly, a variety of cloud services are emerging. Assessing cloud services is therefore a key issue, both to help cloud users select the most suitable cloud service and to help cloud providers offer that service with the highest quality. The criteria parameters that define cloud services are complex, which leads to deviation in cloud service quality. In this paper, we propose a method for assessing the importance of parameters in cloud services using rough set theory. The method can effectively compute the importance of cloud service parameters and rank them. On the one hand, the calculation can serve as a credible reference when users choose an appropriate cloud service. On the other hand, it can help cloud service providers meet user requirements and enhance the user experience. Simulation results show the effectiveness of the method and its relevance in the cloud context...
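The abstract does not specify the rough set formulation used; a common way to compute attribute importance in rough set theory is the dependency-degree (positive-region) significance, sketched below in Python on a hypothetical, discretised cloud-service decision table. The attribute names and quality labels are illustrative only and not from the paper.

```python
from collections import defaultdict

def positive_region(table, condition_attrs, decision_attr):
    """Objects whose condition-attribute equivalence class maps to a single decision value."""
    classes = defaultdict(list)
    for i, row in enumerate(table):
        key = tuple(row[a] for a in condition_attrs)
        classes[key].append(i)
    pos = set()
    for members in classes.values():
        decisions = {table[i][decision_attr] for i in members}
        if len(decisions) == 1:          # equivalence class is consistent w.r.t. the decision
            pos.update(members)
    return pos

def attribute_importance(table, condition_attrs, decision_attr):
    """Significance of each attribute: drop in dependency degree when it is removed."""
    n = len(table)
    gamma_all = len(positive_region(table, condition_attrs, decision_attr)) / n
    scores = {}
    for a in condition_attrs:
        rest = [b for b in condition_attrs if b != a]
        gamma_rest = len(positive_region(table, rest, decision_attr)) / n
        scores[a] = gamma_all - gamma_rest
    return dict(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

# Hypothetical cloud-service decision table: parameters are discretised levels,
# "quality" is the decision attribute.
services = [
    {"response_time": "low",  "price": "high", "availability": "high", "quality": "good"},
    {"response_time": "low",  "price": "low",  "availability": "high", "quality": "good"},
    {"response_time": "high", "price": "low",  "availability": "low",  "quality": "poor"},
    {"response_time": "high", "price": "high", "availability": "low",  "quality": "poor"},
]
print(attribute_importance(services, ["response_time", "price", "availability"], "quality"))
```

Ranking parameters by this significance score gives the kind of ordered list the abstract describes as a reference for users and providers.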
The evaluation of human environment risk lacks quantitative data, while the qualitative knowledge cannot easily be quantified and synthesized. Furthermore, the experts are sometimes not well acquainted with the whole indicator system or cannot reach an agreement in their comments. Conventional evaluation methods cannot resolve these difficulties effectively, so the quantification of human environment risk remains a conundrum. The compatibility cloud model theory can set up a conversion model between qualitative knowledge and quantitative values, which provides a technical approach to evaluating the risk of the human environment. However, the hesitant opinions of experts, stemming from missing knowledge of the whole system or from diverging views, cannot be handled well by traditional compatibility cloud model theory. Therefore, this paper introduces hesitant fuzzy set theory, combined with cloud model theory, to construct a hesitant cloud model for the quantitative assessment of human environment risk. Finally, an experimental evaluation of the risk along the Maritime Silk Road is carried out...
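For readers unfamiliar with cloud model theory, the sketch below shows the standard forward normal cloud generator, which turns a qualitative concept described by expectation Ex, entropy En, and hyper-entropy He into quantitative cloud drops. The pooling of several hesitant expert opinions into one triple is a deliberately naive assumption for illustration and is not the paper's hesitant cloud model; the opinion values are hypothetical.

```python
import numpy as np

def forward_normal_cloud(Ex, En, He, n=1000, seed=0):
    """Forward normal cloud generator: produce n cloud drops (x, certainty degree)."""
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, n)            # per-drop entropy, dispersed by hyper-entropy He
    En_prime = np.abs(En_prime) + 1e-12         # keep the standard deviation positive
    x = rng.normal(Ex, En_prime)                # drop positions around the expectation Ex
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))   # certainty degree of each drop
    return x, mu

# Hypothetical hesitant evaluation: each expert gives one or more (Ex, En, He) triples
# for a risk indicator on a [0, 1] scale; here they are simply pooled and averaged.
expert_opinions = [
    [(0.6, 0.08, 0.02)],                        # expert 1, single opinion
    [(0.5, 0.10, 0.03), (0.7, 0.06, 0.02)],     # expert 2, hesitant between two judgements
]
pooled = np.array([t for opinions in expert_opinions for t in opinions])
Ex, En, He = pooled.mean(axis=0)                # naive aggregation of the hesitant set
drops, certainty = forward_normal_cloud(Ex, En, He)
print(f"aggregated cloud: Ex={Ex:.3f}, En={En:.3f}, He={He:.3f}")
```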
Data compression is an area that needs to be given the utmost attention in text quality assessment, and different methodologies have been defined for this purpose. Choosing the best machine learning algorithm is therefore important, and in addition to the various compression technologies and methodologies, the selection of a good data compression tool matters most. A wide range of data compression techniques is available, working both online and offline, so it becomes difficult to decide which technique serves best. Hence the need for the right method for text compression and for an algorithm that can reveal the best tool among the given ones. A data compression algorithm is to be developed that consumes less time while providing a higher compression ratio than existing techniques. In this paper we present a hybrid approach to compressing text data. This hybrid approach is a combination of a dynamic bit reduction method and Huffman coding...
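The dynamic bit reduction stage is not described in the abstract, so the Python sketch below covers only the Huffman coding component of such a hybrid: it builds a prefix code from symbol frequencies and reports the resulting bit length for a sample string. The sample text and the reported sizes are illustrative, not results from the paper.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table (symbol -> bit string) from symbol frequencies in `text`."""
    freq = Counter(text)
    if len(freq) == 1:                      # degenerate case: a single distinct symbol
        return {next(iter(freq)): "0"}
    # Heap items: (frequency, tie-breaker, tree), where a tree is a symbol or a (left, right) pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # merge the two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

text = "this is an example of huffman coding"
codes = huffman_codes(text)
encoded = "".join(codes[ch] for ch in text)
print(f"original: {len(text) * 8} bits, Huffman-encoded: {len(encoded)} bits")
```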